14. NLP Application: Google Neural Machine Translation


The best way to appreciate an application is to look at real-world systems that are in production right now. In late 2016, Google published the following paper describing its Neural Machine Translation system:

Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine Translation [pdf]

This system later went into production, powering Google Translate.

Take a stab at reading the paper and connecting it to what we've discussed in this lesson so far. Below are a few questions to guide this external reading:

  • Is Google’s Neural Machine Translation system a sequence-to-sequence model?
  • Does the model utilize attention?
  • If the model does use attention, does it use additive or multiplicative attention?
  • What kind of RNN cell does the model use?
  • Does the model use bidirectional RNNs at all?
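As a refresher for the attention question above, the two scoring functions differ in how they compare a decoder state with each encoder state: additive (Bahdanau-style) attention passes both through a small feed-forward layer, while multiplicative (Luong-style) attention uses a bilinear product. The toy NumPy sketch below illustrates the difference; all names, weights, and dimensions are illustrative and are not taken from the GNMT paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 4                             # toy hidden size
h_dec = rng.normal(size=d)        # current decoder hidden state
H_enc = rng.normal(size=(5, d))   # five encoder hidden states

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Additive (Bahdanau-style) score: v^T tanh(W1 @ h_dec + W2 @ h_enc)
W1 = rng.normal(size=(d, d))
W2 = rng.normal(size=(d, d))
v = rng.normal(size=d)
add_scores = np.array([v @ np.tanh(W1 @ h_dec + W2 @ h) for h in H_enc])

# Multiplicative (Luong-style, "general") score: h_dec^T @ W @ h_enc
W = rng.normal(size=(d, d))
mul_scores = np.array([h_dec @ W @ h for h in H_enc])

# Either set of scores is normalized into attention weights,
# which produce a context vector as a weighted sum of encoder states.
weights = softmax(add_scores)
context = weights @ H_enc
```

Keep these two forms in mind when you check which one the GNMT paper actually uses.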

Further reading on text summarization:

Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond